
    Entanglement and optimal strings of qubits for memory channels

    We investigate the problem of enhancing mutual information by encoding classical data into entangled input states of arbitrary length. We show that while there is a threshold memory or correlation parameter beyond which entangled states outperform separable states, yielding a higher mutual information, this threshold increases toward unity as the length of the string increases. These observations imply that encoding classical data into entangled states may not enhance the classical capacity of quantum channels. Comment: 14 pages, 8 figures, LaTeX, accepted for publication in Physical Review
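
    As a rough numerical companion to this abstract, the sketch below (Python with NumPy) compares the Holevo information of four orthogonal product states against four Bell states sent through two uses of a depolarizing-type Pauli channel with a Markov-style correlation parameter mu. The parameterization p_ij = (1-mu) p_i p_j + mu p_i delta_ij, the error rate 0.2, and the chosen signal ensembles are illustrative assumptions of ours, not taken from the paper.

    # Toy illustration: Holevo quantity chi = S(avg output) - avg S(output) for
    # product vs. Bell-state encodings over two uses of a correlated Pauli channel.
    import numpy as np
    from itertools import product

    I = np.eye(2); X = np.array([[0, 1], [1, 0]], complex)
    Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1, -1]).astype(complex)
    paulis = [I, X, Y, Z]

    def channel(rho, p, mu):
        """Two-qubit Pauli channel with memory parameter mu (assumed toy model)."""
        single = [1 - p, p/3, p/3, p/3]            # depolarizing-type Pauli weights
        out = np.zeros((4, 4), complex)
        for i, j in product(range(4), repeat=2):
            w = (1 - mu) * single[i] * single[j] + mu * single[i] * (i == j)
            U = np.kron(paulis[i], paulis[j])
            out += w * U @ rho @ U.conj().T
        return out

    def entropy(rho):
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-(ev * np.log2(ev)).sum())

    def holevo(states, p, mu):
        outs = [channel(np.outer(s, s.conj()), p, mu) for s in states]
        avg = sum(outs) / len(outs)
        return entropy(avg) - np.mean([entropy(o) for o in outs])

    ket = lambda bits: np.kron(*[np.eye(2)[b] for b in bits]).astype(complex)
    product_states = [ket((a, b)) for a in (0, 1) for b in (0, 1)]
    bell = [(ket((0, 0)) + ket((1, 1))) / np.sqrt(2), (ket((0, 0)) - ket((1, 1))) / np.sqrt(2),
            (ket((0, 1)) + ket((1, 0))) / np.sqrt(2), (ket((0, 1)) - ket((1, 0))) / np.sqrt(2)]

    for mu in (0.0, 0.5, 0.9):
        chi_prod = holevo(product_states, 0.2, mu)
        chi_bell = holevo(bell, 0.2, mu)
        print(f"mu={mu:.1f}  chi(product)={chi_prod:.4f}  chi(Bell)={chi_bell:.4f}")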

    Classical Statistics Inherent in a Quantum Density Matrix

    A density matrix formulation of classical bipartite correlations is constructed. This leads to an understanding of how classical statistical correlations appear intertwined with quantum correlations, as well as a physical underpinning of these correlations. As a byproduct of this analysis, a physical basis for the classical statistical correlations leading to additive entropy in a bipartite system, discussed recently by Tsallis et al., emerges as inherent classical spin fluctuations. It is found that in this example the quantum correlations shrink the region of additivity in phase space. Comment: 10 pages, 3 figures
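
    A minimal sketch of the kind of object discussed here, assuming a toy two-qubit example rather than the paper's specific construction: a density matrix carrying only classical correlations, rho_AB = sum_i p_i |i><i| (x) |i><i|, together with a check of how its von Neumann entropy compares with the sum of the marginal entropies.

    # Classically correlated bipartite density matrix and its entropies (toy example).
    import numpy as np

    def entropy(rho):
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-(ev * np.log2(ev)).sum())

    p = np.array([0.7, 0.3])                      # classical spin-up/down weights (invented)
    basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

    rho_AB = sum(pi * np.kron(np.outer(b, b), np.outer(b, b))
                 for pi, b in zip(p, basis))

    # partial traces of the 4x4 state (A = first qubit, B = second qubit)
    rho4 = rho_AB.reshape(2, 2, 2, 2)
    rho_A = np.einsum('ikjk->ij', rho4)
    rho_B = np.einsum('kikj->ij', rho4)

    print('S(AB)        =', entropy(rho_AB))      # equals H(p), the classical entropy
    print('S(A) + S(B)  =', entropy(rho_A) + entropy(rho_B))
    print('mutual info  =', entropy(rho_A) + entropy(rho_B) - entropy(rho_AB))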

    LISA Source Confusion

    The Laser Interferometer Space Antenna (LISA) will detect thousands of gravitational wave sources. Many of these sources will be overlapping in the sense that their signals have a non-zero cross-correlation. Such overlaps lead to source confusion, which adversely affects how well we can extract information about the individual sources. Here we study how source confusion impacts parameter estimation for galactic compact binaries, with emphasis on the effects of the number of overlapping sources, the observation time, the gravitational wave frequencies of the sources, and the degree of signal correlation. Our main findings are that the parameter resolution decays exponentially with the number of overlapping sources, and super-exponentially with the degree of cross-correlation. We also find that an extended mission lifetime is key to disentangling the source confusion, as the parameter resolution for overlapping sources improves much faster than the usual square root of the observation time. Comment: 8 pages, 14 figures
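
    To illustrate the role of observation time, here is a toy calculation that uses none of the paper's LISA response or noise modelling: the normalized overlap of two monochromatic signals separated by an invented 20 nHz, evaluated in closed form for several mission durations. The overlap, and with it the confusion between the two sources, falls off (in an oscillatory, sinc-like way) as T grows.

    # Overlap of two nearby-frequency sinusoids over an observation time T (all numbers made up).
    import numpy as np

    def cross(f1, f2, T):
        """Integral of sin(2 pi f1 t) * sin(2 pi f2 t) over [0, T], in closed form."""
        a, b = 2*np.pi*f1, 2*np.pi*f2
        if f1 == f2:
            return T/2 - np.sin(2*a*T)/(4*a)
        return np.sin((a - b)*T)/(2*(a - b)) - np.sin((a + b)*T)/(2*(a + b))

    def overlap(f1, f2, T):
        return cross(f1, f2, T) / np.sqrt(cross(f1, f1, T) * cross(f2, f2, T))

    f1, f2 = 3.00000e-3, 3.00002e-3   # two galactic binaries 20 nHz apart (illustrative)
    year = 3.156e7                     # seconds in a year
    for yrs in (0.25, 0.5, 1.0, 2.0, 4.0):
        print(f"{yrs:4.2f} yr   overlap = {overlap(f1, f2, yrs*year):+.3f}")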

    Time's Barbed Arrow: Irreversibility, Crypticity, and Stored Information

    We show why the amount of information communicated between the past and future--the excess entropy--is not in general the amount of information stored in the present--the statistical complexity. This is a long-standing puzzle, since the latter is what is required for optimal prediction, while the former describes observed behavior. We lay out a classification scheme for dynamical systems and stochastic processes that determines when these two quantities are the same or different. We do this by developing closed-form expressions for the excess entropy in terms of optimal causal predictors and retrodictors--the epsilon-machines of computational mechanics. A process's causal irreversibility and crypticity are key determining properties. Comment: 4 pages, 2 figures
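
    The distinction can be made concrete on a textbook example. The sketch below assumes the Golden Mean process (a standard illustration, not necessarily one used in the paper): it computes the statistical complexity C_mu as the entropy of the epsilon-machine's causal-state distribution and estimates the excess entropy E from block entropies. For this process C_mu exceeds E, the kind of gap between stored and communicated information that the abstract describes.

    # Statistical complexity and excess entropy for the Golden Mean process (no consecutive 0s).
    import numpy as np
    from itertools import product

    p = 0.5                                        # Pr[emit 1 | state A]
    # labeled transition matrices T[s][i, j] = Pr[emit s, go to state j | state i]; states 0=A, 1=B
    T = {1: np.array([[p, 0.0], [1.0, 0.0]]),
         0: np.array([[0.0, 1 - p], [0.0, 0.0]])}

    # stationary distribution over causal states
    M = T[0] + T[1]
    evals, evecs = np.linalg.eig(M.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))]); pi /= pi.sum()

    H = lambda q: float(-(q[q > 0] * np.log2(q[q > 0])).sum())
    C_mu = H(pi)                                   # statistical complexity

    def block_entropy(L):
        total = 0.0
        for word in product((0, 1), repeat=L):
            v = pi.copy()
            for s in word:
                v = v @ T[s]
            pw = v.sum()
            if pw > 0:
                total -= pw * np.log2(pw)
        return total

    L = 14
    h_mu = block_entropy(L) - block_entropy(L - 1)   # entropy-rate estimate
    E = block_entropy(L) - L * h_mu                  # excess-entropy estimate
    print('C_mu =', round(C_mu, 4), ' h_mu =', round(h_mu, 4), ' E =', round(E, 4))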

    Configurational entropy of network-forming materials

    We present a computationally efficient method to calculate the configurational entropy of network-forming materials. The method requires only the atomic coordinates and bonds of a single well-relaxed configuration. This is in contrast to other methods for determining entropy, such as thermodynamic integration, which require multiple simulations. We use our method to obtain the configurational entropy of well-relaxed networks of amorphous silicon and vitreous silica. For these materials we find configurational entropies of 1.02 k_B and 0.97 k_B per silicon atom, respectively, with k_B the Boltzmann constant. Comment: 4 pages, 4 figures

    Statistical mechanics of lossy compression using multilayer perceptrons

    Statistical mechanics is applied to lossy compression of unbiased Boolean messages using multilayer perceptrons. We utilize a tree-like committee machine (committee tree) and a tree-like parity machine (parity tree) whose transfer functions are monotonic. For compression using a committee tree, the lower bound on achievable distortion becomes smaller as the number of hidden units K increases, but it cannot reach the Shannon bound even as K -> infinity. For compression using a parity tree with K >= 2 hidden units, the rate-distortion function, which is known as the theoretical limit for compression, is derived in the limit where the code length becomes infinite. Comment: 12 pages, 5 figures
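
    A toy, brute-force version of such a scheme, under assumptions of our own (tiny message and codeword sizes, K = 2, sign transfer functions, and exhaustive search in place of any practical encoder): the codeword whose parity-tree output is closest in Hamming distance to the message is found by enumeration, and the resulting distortion at rate R = N/M is reported.

    # Toy parity-tree lossy compressor: decode a short codeword through a K=2 parity tree
    # and pick, by exhaustive search, the codeword minimizing the Hamming distortion.
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(0)
    M, N, K = 40, 10, 2                  # message bits, codeword bits, hidden units (arbitrary)
    y = rng.choice([-1, 1], size=M)      # unbiased Boolean message to compress
    x = rng.normal(size=(M, K, N // K))  # fixed random inputs shared by encoder and decoder

    def decode(s):
        """Parity tree: product over hidden units of sgn(s_k . x_k)."""
        s_split = s.reshape(K, N // K)
        hidden = np.sign(np.einsum('mkj,kj->mk', x, s_split))
        return hidden.prod(axis=1)

    best_s, best_d = None, np.inf
    for bits in product([-1, 1], repeat=N):          # exhaustive encoder, 2^N candidates
        s = np.array(bits)
        d = np.mean(decode(s) != y)                  # Hamming distortion
        if d < best_d:
            best_s, best_d = s, d

    print(f"rate R = N/M = {N/M:.2f},  achieved distortion = {best_d:.3f}")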

    Entanglement can completely defeat quantum noise

    We describe two quantum channels that individually cannot send any information, even classical, without some chance of decoding error. Yet together, a single use of each channel can send quantum information perfectly reliably. This proves that the zero-error classical capacity exhibits superactivation, the extreme form of the superadditivity phenomenon in which entangled inputs allow communication over zero-capacity channels. Our result is stronger still, as it even allows zero-error quantum communication when the two channels are combined. It thus reveals a remarkable new way in which entanglement across two systems can be used to resist noise, in this case perfectly. We also show a new form of superactivation by entanglement shared between sender and receiver. Comment: 4 pages, 1 figure

    The Minimum Description Length Principle and Model Selection in Spectropolarimetry

    It is shown that the two-part Minimum Description Length Principle can be used to discriminate among different models that can explain a given observed dataset. The description length is chosen to be the sum of the length of the message needed to encode the model plus the length of the message needed to encode the data when the model is applied to the dataset. It is verified that the proposed principle can efficiently distinguish the model that correctly fits the observations while avoiding over-fitting. The capabilities of this criterion are shown in two simple problems for the analysis of observed spectropolarimetric signals. The first is the de-noising of observations with the aid of the PCA technique. The second is the selection of the optimal number of parameters in LTE inversions. We propose this criterion as a quantitative approach for distinguishing the most plausible model among a set of proposed models. This quantity is very easy to implement as an additional output of existing inversion codes. Comment: Accepted for publication in the Astrophysical Journal
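
    The two-part criterion itself is easy to demonstrate on a generic problem. The sketch below applies it to polynomial degree selection on synthetic data, with a crude parameter cost of (k/2) log2(n) bits and a Gaussian code for the residuals; these coding choices and the synthetic signal are our own assumptions, not the spectropolarimetric applications of the paper.

    # Two-part MDL on a toy curve-fitting problem: pick the polynomial degree that
    # minimizes (bits to encode the model) + (bits to encode the residuals).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 60
    t = np.linspace(-1, 1, n)
    data = 1.0 - 2.0*t + 0.5*t**3 + rng.normal(scale=0.1, size=n)   # synthetic signal

    def description_length(degree):
        coeffs = np.polyfit(t, data, degree)
        residuals = data - np.polyval(coeffs, t)
        k = degree + 1
        model_bits = 0.5 * k * np.log2(n)                    # cost of encoding the parameters
        sigma2 = max(residuals.var(), 1e-12)
        data_bits = 0.5 * n * np.log2(2*np.pi*np.e*sigma2)   # Gaussian code for the residuals
        return model_bits + data_bits

    lengths = {d: description_length(d) for d in range(0, 9)}
    best = min(lengths, key=lengths.get)
    print('description lengths (bits):', {d: round(v, 1) for d, v in lengths.items()})
    print('MDL-selected polynomial degree:', best)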

    Information Content of Polarization Measurements

    Information entropy is applied to the state of knowledge of reaction amplitudes in pseudoscalar meson photoproduction, and a scheme is developed that quantifies the information content of a measured set of polarization observables. It is shown that this definition of information is a more practical measure of the quality of a set of measured observables than whether the combination is a mathematically complete set. It is also shown that when experimental uncertainty is introduced, complete sets of measurements do not necessarily remove ambiguities, and that experiments should strive to measure as many observables as practical in order to extract amplitudes. Comment: 19 pages, 4 figures; figures updated, minor textual corrections
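
    A schematic of the idea, with everything (a single phase-like amplitude parameter, the observables cos(phi) and sin(phi), the noise level) invented for illustration rather than taken from the photoproduction formalism: the information content of a measurement set is scored as the Shannon-entropy drop of a discretized posterior, and an incomplete set leaves a discrete ambiguity that keeps the posterior entropy high.

    # Toy information content of a measurement set: entropy reduction of a gridded posterior.
    import numpy as np

    phi_grid = np.linspace(-np.pi, np.pi, 2000, endpoint=False)
    prior = np.full_like(phi_grid, 1.0 / phi_grid.size)

    def entropy_bits(p):
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def posterior(observables, measured, sigma=0.1):
        """Multiply Gaussian likelihoods of each measured observable onto the prior."""
        logL = np.zeros_like(phi_grid)
        for obs, m in zip(observables, measured):
            logL += -0.5 * ((obs(phi_grid) - m) / sigma) ** 2
        post = prior * np.exp(logL - logL.max())
        return post / post.sum()

    phi_true = 0.8
    rng = np.random.default_rng(3)
    noisy = lambda f: f(phi_true) + rng.normal(scale=0.1)

    for name, obs_set in [('cos only', [np.cos]),
                          ('cos + sin', [np.cos, np.sin])]:
        post = posterior(obs_set, [noisy(o) for o in obs_set])
        gain = entropy_bits(prior) - entropy_bits(post)
        print(f"{name:10s} information gain = {gain:5.2f} bits")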